Beyond Kappa: Estimating Inter-Rater Agreement with Nominal Classifications


Similar Articles

Inter-rater Agreement on Sentence Formality

Formality is one of the most important dimensions of writing style variation. In this study we conducted an inter-rater reliability experiment for assessing sentence formality on a five-point Likert scale, and obtained good agreement results as well as different rating distributions for different sentence categories. We also performed a difficulty analysis to identify the bottlenecks of our rat...
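The full study is paywalled here, but agreement on an ordinal scale like this one is often summarized with a weighted kappa, which penalizes near-misses less than distant disagreements. Below is a minimal sketch of quadratic-weighted Cohen's kappa in plain Python; the two raters' formality ratings are invented for illustration, and the five-point scale is the only detail taken from the abstract above.

```python
def weighted_kappa(r1, r2, k=5):
    """Quadratic-weighted Cohen's kappa for two raters on a 1..k scale."""
    n = len(r1)
    obs = [[0.0] * k for _ in range(k)]  # joint rating proportions
    for a, b in zip(r1, r2):
        obs[a - 1][b - 1] += 1.0 / n
    # Marginal proportions for each rater.
    pa = [sum(obs[i][j] for j in range(k)) for i in range(k)]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    obs_dis = exp_dis = 0.0
    for i in range(k):
        for j in range(k):
            w = ((i - j) / (k - 1)) ** 2  # quadratic disagreement weight
            obs_dis += w * obs[i][j]              # observed disagreement
            exp_dis += w * pa[i] * pb[j]          # chance-expected disagreement
    return 1.0 - obs_dis / exp_dis

# Hypothetical formality ratings for ten sentences:
rater1 = [1, 2, 2, 3, 3, 4, 4, 5, 5, 3]
rater2 = [1, 2, 3, 3, 4, 4, 4, 5, 4, 3]
print(round(weighted_kappa(rater1, rater2), 3))
```

The quadratic weights make a 2-versus-3 disagreement cost far less than a 1-versus-5 one; setting every off-diagonal weight to 1 instead would recover ordinary unweighted Cohen's kappa.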


Inter-rater agreement of paramedic rhythm labeling.

STUDY HYPOTHESIS: Substantial inter-rater agreement is present in paramedics' labeling of ventricular fibrillation and asystolic rhythms. DESIGN: Prospective, cross-sectional study. TYPE OF PARTICIPANTS: One hundred five practicing paramedics from nonvolunteer agencies who are advanced cardiac life support certified. METHODS: Five static cardiac arrest rhythm strips, classified by Cummi...


Comparison between inter-rater reliability and inter-rater agreement in performance assessment.

INTRODUCTION: Over the years, performance assessment (PA) has been widely employed in medical education; the Objective Structured Clinical Examination (OSCE) is an excellent example. Typically, performance assessment involves multiple raters, and therefore consistency among the scores provided by the raters is a precondition to ensure the accuracy of the assessment. Inter-rater agreement and i...


Kappa coefficient: a popular measure of rater agreement

In mental health and psychosocial studies it is often necessary to report on the between-rater agreement of measures used in the study. This paper discusses the concept of agreement, highlighting its fundamental difference from correlation. Several examples demonstrate how to compute the kappa coefficient - a popular statistic for measuring agreement - both by hand and by using statistical soft...
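The abstract's own worked examples are not reproduced in this excerpt, so as a stand-in, here is a minimal sketch of the standard Cohen's kappa computation for two raters assigning nominal labels; the diagnosis data are invented for illustration.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Unweighted Cohen's kappa for two raters assigning nominal labels."""
    n = len(r1)
    labels = set(r1) | set(r2)
    # Observed agreement: proportion of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: sum over labels of the product of marginal proportions.
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[lab] * c2[lab] for lab in labels) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical diagnoses from two clinicians for eight cases:
rater1 = ["anxiety", "depression", "anxiety", "other",
          "depression", "anxiety", "other", "depression"]
rater2 = ["anxiety", "depression", "depression", "other",
          "depression", "anxiety", "anxiety", "depression"]
print(round(cohens_kappa(rater1, rater2), 3))
```

Here the raw agreement p_o is 0.75, but chance alone would give about 0.36 given these label frequencies, so kappa lands near 0.61; this gap between raw and chance-corrected agreement is exactly the distinction from correlation that the abstract highlights.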


Fuzzy kappa for the agreement measure of fuzzy classifications

Weibei Dou, Yuan Ren, Qian Wu, Su Ruan, Yanping Chen, Daniel Bloyet, Jean-Marc Constans. Department of Electronic Engineering, Tsinghua University, 100084 Beijing, China; GREYC-CNRS UMR 6072, 6 Boulevard Maréchal Juin, 14050 Caen, France; CReSTIC, 9 Rue de Québec, 10026 Troyes, France; Imaging Diagnostic Center, Nanfang Hospital, Guangzho...
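Only the author list of this paper survives in the excerpt above, so its exact fuzzy-kappa definition is not reproduced here. As a rough, hypothetical illustration of the underlying idea (comparing membership vectors instead of hard labels), the sketch below scores per-item agreement as the overlap of two membership vectors and corrects for chance using the raters' mean profiles; it reduces to ordinary Cohen's kappa for crisp 0/1 memberships, but it should not be read as the authors' formula.

```python
def fuzzy_kappa_sketch(m1, m2):
    """Kappa-style agreement for fuzzy classifications (illustrative only).

    m1, m2: per-item membership vectors summing to 1 across classes.
    This is one plausible construction, not the published fuzzy kappa.
    """
    n, k = len(m1), len(m1[0])
    # Observed agreement: mean overlap (sum of element-wise minima).
    p_o = sum(sum(min(u[c], v[c]) for c in range(k))
              for u, v in zip(m1, m2)) / n
    # Chance agreement: overlap expected from each rater's mean profile.
    avg1 = [sum(u[c] for u in m1) / n for c in range(k)]
    avg2 = [sum(v[c] for v in m2) / n for c in range(k)]
    p_e = sum(avg1[c] * avg2[c] for c in range(k))
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical fuzzy segmentations of four voxels into 3 tissue classes:
rater1 = [[0.8, 0.2, 0.0], [0.1, 0.7, 0.2], [0.0, 0.3, 0.7], [0.5, 0.5, 0.0]]
rater2 = [[0.7, 0.3, 0.0], [0.2, 0.6, 0.2], [0.1, 0.2, 0.7], [0.4, 0.4, 0.2]]
print(round(fuzzy_kappa_sketch(rater1, rater2), 3))
```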



Journal

Journal title: Journal of Modern Applied Statistical Methods

Year: 2009

ISSN: 1538-9472

DOI: 10.22237/jmasm/1241136540